Using Machine Learning to Create More Capable Capacitors

Georgia Tech Researchers Use SDSC’s Comet and TACC’s Stampede2 to Speed Design

Published June 27, 2019

Scientists at Georgia Tech are using machine learning with supercomputers to analyze the electronic structure of materials to ultimately find ways to build more capable capacitors. (Left) Density functional theory (DFT) charge density of a molecular dynamics snapshot of a benzene molecule. (Right) Charge density difference between the machine learning prediction and DFT for the same benzene structure. Credit: Rampi Ramprasad, Georgia Tech

By Jorge Salazar, TACC Communications and Jan Zverina, SDSC Communications

Capacitors, given their high energy output and recharging speed, could play a major role in powering the machines of the future, from electric cars to cell phones. However, the biggest hurdle for capacitors as energy storage devices is that they store much less energy than a similar-sized battery.

Researchers at Georgia Institute of Technology (Georgia Tech) are tackling that problem in a novel way by using supercomputers and machine learning techniques to ultimately find ways to build more capable capacitors.

The method was described in a study published in February 2019 in npj Computational Materials, a Nature Partner Journal. The study involved teaching a computer to analyze, at the atomic level, two materials, aluminum and polyethylene, that are used to make some capacitors.

The researchers focused on finding a way to more quickly analyze the electronic structure of the capacitor materials, looking for features that could affect performance. “The electronics industry wants to know the electronic properties and structure of all of the materials they use to produce devices, including capacitors,” said Rampi Ramprasad, a professor in Georgia Tech’s School of Materials Science and Engineering.

For example, polyethylene is a very good insulator with a large band gap, the energy range forbidden to electrical charge carriers. But if it has a defect, unwanted charge carriers are allowed into the band gap, reducing efficiency, he said.

“In order to understand where the defects are and what role they play, we need to compute the entire atomic structure, something that so far has been extremely difficult,” said Ramprasad. “The current method of analyzing those materials using quantum mechanics is so slow that it limits how much analysis can be performed at any given time.”

Ramprasad and his colleagues used machine learning to help develop new materials. In this study, they used data generated from a quantum mechanical analysis of aluminum and polyethylene to teach a powerful computer how to simulate that analysis.

Analyzing the electronic structure of a material with quantum mechanics involves solving the Kohn-Sham equation of density functional theory, which generates data on wave functions and energy levels. That data is then used to compute the total potential energy of the system and atomic forces.
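For context, the Kohn-Sham equations take the following standard textbook form (general background in LaTeX notation, not a formula reproduced from the study), where v_eff is the effective potential felt by each electron, the psi_i are the Kohn-Sham orbitals with energies epsilon_i, and n is the electron density built from those orbitals:

    \left[ -\frac{\hbar^2}{2m}\nabla^2 + v_{\mathrm{eff}}(\mathbf{r}) \right] \psi_i(\mathbf{r}) = \varepsilon_i \, \psi_i(\mathbf{r}),
    \qquad n(\mathbf{r}) = \sum_i |\psi_i(\mathbf{r})|^2

Solving these equations self-consistently for every atomic configuration is what makes the conventional analysis so computationally expensive.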

Overview of the process used to generate surrogate models for the charge density and density of states. The first step entails the generation of the training dataset by sampling random snapshots of molecular dynamics trajectories. First-principles calculations were then performed on these systems (shown in Figure S1) to obtain the training atomic configurations, charge densities, and local density of states. The scalar (S), vector (V), and tensor (T) fingerprint invariants are mapped to the local electronic structure at every grid-point. Credit: Rampi Ramprasad, Georgia Tech
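To illustrate the general idea of such a surrogate model (a minimal sketch only; the file names and network settings below are hypothetical, and this is not the group's actual code), one could train a simple regressor that maps per-grid-point fingerprints to the DFT charge density:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # Hypothetical training data: one row per grid point.
    # X holds rotationally invariant fingerprints of the local atomic
    # environment; y holds the DFT charge density at that grid point.
    X = np.load("fingerprints.npy")        # shape (n_gridpoints, n_features)
    y = np.load("dft_charge_density.npy")  # shape (n_gridpoints,)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    # A small feed-forward neural network serves as the surrogate model.
    model = MLPRegressor(hidden_layer_sizes=(100, 100), max_iter=500,
                         random_state=0)
    model.fit(X_train, y_train)

    # Predicting the charge density of a new structure then requires only
    # its fingerprints -- no Kohn-Sham solve is needed.
    print("Held-out R^2:", model.score(X_test, y_test))

In the workflow shown in the figure, separate surrogate models of this kind are built for the charge density and for the local density of states.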

The researchers used the Comet supercomputer at the San Diego Supercomputer Center (SDSC), an Organized Research Unit of the University of California San Diego, for early calculations, and the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC) at the University of Texas at Austin for the later stages of this research. Both systems are funded by the National Science Foundation under multi-year awards.

“In the work leading up to the study, we used Comet extensively for high-throughput polymer electronic property calculations, such as the effect of polymer morphology on the band gap of polymers,” said study co-author Deepak Kamal, a graduate student advised by Ramprasad at the Georgia Tech School of Materials Science and Engineering. “We used Comet because it was fast and efficient at handling a large number of calculations.”

The new machine learning method developed by Ramprasad and colleagues produced comparable results several orders of magnitude faster than the conventional technique based on quantum mechanics.

“This unprecedented speed-up in computational capability will allow us to design electronic materials that are superior to what is currently out there,” Ramprasad said. “Basically, we can say, ‘here are defects with this material that will really diminish the efficiency of its electronic structure.’ Once we can address such aspects efficiently, we can better design electronic devices.”

While the study focused on aluminum and polyethylene, machine learning could be used to analyze the electronic structure of a wider range of materials. Beyond analyzing electronic structure, other aspects of material structure now analyzed by quantum mechanics could also be hastened by the machine learning approach, Ramprasad said.

“In part we selected aluminum and polyethylene because they are components of a capacitor,” he explained. “But we also demonstrated that one can use this method for vastly different materials, such as metals that are conductors and polymers that are insulators.”

The faster processing allowed by the machine learning method would also enable researchers to more quickly simulate how modifications to a material will impact its electronic structure, potentially revealing new ways to improve its efficiency. 

Added Kamal: “Supercomputing systems allow high-throughput computing which enables us to create vast databases of knowledge about various material systems. This knowledge can then be utilized to find the best material for a specific application.”

Authors of the study, titled “Solving the Electronic Structure Problem with Machine Learning,” include Anand Chandrasekaran, Rohit Batra, Chiho Kim, and Lihua Chen, in addition to Ramprasad and Kamal. All are from Georgia Tech’s School of Materials Science and Engineering. The research was also supported by the Office of Naval Research under grant N0014-17-1-2656.

About SDSC

As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s petascale Comet supercomputer is a key resource within the National Science Foundation’s XSEDE (eXtreme Science and Engineering Discovery Environment) program.